Achieving Linear or Quadratic Convergence on Piecewise Smooth Optimization Problems

Authors

  • ANDREAS GRIEWANK
  • ANDREA WALTHER
Abstract

Many problems in machine learning involve objective functions that are piecewise smooth [7] due to the occurrence of absolute values, mins, and maxes in their evaluation procedures; see e.g. [8]. For such functions we derived in [3] first order (KKT) and second order (SSC) optimality conditions, which can be checked on the basis of a local piecewise linearization [2] that can be computed in an AD-like fashion, e.g. using ADOL-C or Tapenade. In that analysis, a key assumption on the local piecewise linearization was the Linear Independence Kink Qualification (LIKQ), a generalization of the Linear Independence Constraint Qualification (LICQ) known from smooth nonlinear optimization. A rather surprising consequence is that checking the optimality conditions is not at all combinatorial but can be done with cubic effort, as in the classical smooth case. Moreover, we show here, first, that under LIKQ with SSC the natural algorithm of successive piecewise linear optimization with a proximal term (SPLOP) achieves a linear rate of convergence. A version of SPLOP has already been implemented and tested in [4, 1]. Secondly, we observe that, even without any kink qualifications, local optimality of the nonlinear objective always requires local optimality of its piecewise linearization, and strict minimality of the latter is in fact equivalent to sharp minimality of the former. Moreover, we show that SPLOP converges quadratically to such sharp minimizers, where the function exhibits linear growth. These results are independent of the particular function representation and in particular allow duplications of switching variables and other intermediates. We note that the classical theory for subgradient [9], proximal [6], and bundle [5] methods usually only yields convergence rates like 1/sqrt(k) or log(k)/k, where k is the iteration counter. Only for strongly convex functions can a linear convergence rate sometimes be established.
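The SPLOP implementation of [4, 1] is not reproduced here, but the core idea, minimizing the local piecewise linearization plus a proximal term, can be sketched in one variable for objectives of the form f(x) = |u(x)| with smooth u, whose piecewise linearization at x is d -> |u(x) + u'(x) d|. The scalar inner problem then has a closed-form solution. All function names below are illustrative, not from the paper's code; note that for f(x) = |x^2 - 1| the kink at x = 1 is a sharp minimizer, so rapid convergence is consistent with the quadratic-rate claim above.

```python
def splop_step(c, a, q):
    # Solve the scalar inner problem: minimize |c + a*d| + (q/2)*d**2 over d.
    if c > a * a / q:
        return -a / q                 # smooth branch: c + a*d stays positive
    if c < -a * a / q:
        return a / q                  # smooth branch: c + a*d stays negative
    return -c / a if a != 0.0 else 0.0  # otherwise the minimizer sits on the kink

def splop(u, du, x, q=1.0, num_iters=8):
    # Successive piecewise linear optimization with a proximal term,
    # specialized to f(x) = |u(x)| in one variable: at each iterate the
    # piecewise linearization d -> |u(x) + u'(x)*d| plus (q/2)*d**2 is minimized.
    for _ in range(num_iters):
        x += splop_step(u(x), du(x), q)
    return x
```

Starting from x = 2.0 with u(x) = x^2 - 1, the iterates land on the kink branch and home in on the sharp minimizer x = 1 within a handful of steps.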
Our assumptions LIKQ and SSC are certainly quite strong, but they do not require convexity, even locally, near a minimizer. In the case of the Lasso problem min ||x||_1 + ρ||Ax − b||, our method coincides with ISTA as described in [6]. Our current implementation of SPLOP allows the verification of the theoretical results mentioned above on the usual set of academic test problems. The number of outer iterations is usually extremely low compared to more established approaches. However, setting up and solving the local piecewise linear problems is not yet adapted to large structured problems. In effect, we have to solve a sequence of closely related convex Quadratic Optimization Problems (QOP) while marching through a polyhedral decomposition of the variable domain. For several aspects, such as the selection of the next polyhedron, the handling of the many locally redundant constraints, and the exploitation of sparsity, there are obvious improvements, which we are currently exploring and implementing. We expect to present results at least on the Lasso problem [6] and on fuzzy pattern trees as described in [8].

1 School of Mathematical Sciences and Information Technology, Yachaytech, Urcuquí, Imbabura, Ecuador
2 Institut für Mathematik, Universität Paderborn, Paderborn, Germany
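For readers unfamiliar with the ISTA connection mentioned above, a minimal sketch follows. It is written for the standard smooth-plus-l1 split 0.5*||Ax − b||^2 + ρ*||x||_1 (the common ISTA formulation, rather than the unsquared norm form quoted in the abstract), and the function names are ours, not from the paper:

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1, applied componentwise.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, rho, num_iters=500):
    # Minimize 0.5*||A x - b||^2 + rho*||x||_1 by iterative soft-thresholding:
    # a gradient step on the smooth part, then the prox of the l1 term.
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * rho)
    return x
```

For A = I the fixed point is simply the componentwise soft-thresholding of b, which makes the piecewise linear structure of the iteration easy to see.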


Related Articles

Convex Optimal Control Problems with Smooth Hamiltonians

Optimal control problems with convex costs, for which Hamiltonians have Lipschitz continuous gradients, are considered. Examples of such problems, including extensions of the linear-quadratic regulator with hard and possibly state-dependent control constraints, and piecewise linear-quadratic penalties are given. Lipschitz continuous differentiability and strong convexity of the terminal cost ar...


Primal-Dual Projected Gradient Algorithms for Extended Linear-Quadratic Programming

Many large-scale problems in dynamic and stochastic optimization can be modeled with extended linear-quadratic programming, which admits penalty terms and treats them through duality. In general the objective functions in such problems are only piecewise smooth and must be minimized or maximized relative to polyhedral sets of high dimensionality. This paper proposes a new class of numerical met...


A first-order interior-point method for linearly constrained smooth optimization

We propose a first-order interior-point method for linearly constrained smooth optimization that unifies and extends first-order affine-scaling method and replicator dynamics method for standard quadratic programming. Global convergence and, in the case of quadratic program, (sub)linear convergence rate and iterate convergence results are derived. Numerical experience on simplex constrained pro...


Quadratic Stabilization and Control of Piecewise-Linear Systems

We consider analysis and controller synthesis of piecewise-linear systems. The method is based on constructing quadratic and piecewise-quadratic Lyapunov functions that prove stability and performance for the system. It is shown that proving stability and performance, or designing (state-feedback) controllers, can be cast as convex optimization problems involving linear matrix inequalities that...


Polynomial Methods for Separable Convex Optimization in Unimodular Linear Spaces with Applications

We consider the problem of minimizing a separable convex objective function over the linear space given by system Mx = 0 with M a totally unimodular matrix. In particular, this generalizes the usual minimum linear cost circulation and co-circulation problems in a network, and the problems of determining the Euclidean distance from a point to the perfect bipartite matching polytope and the feasi...




Publication date: 2017